Orthogonal Polynomials, Quadrature, and Approximation: Computational Methods and Software (in Matlab)

Author

  • Walter Gautschi
Abstract

Orthogonal polynomials, unless they are classical, require special techniques for their computation. One of the central problems is to generate the coefficients in the basic three-term recurrence relation they are known to satisfy. There are two general approaches for doing this: methods based on moment information, and discretization methods. In the former, one develops algorithms that take as input given moments, or modified moments, of the underlying measure and produce as output the desired recurrence coefficients. In theory, these algorithms yield exact answers; in practice, owing to rounding errors, the results are potentially inaccurate depending on the numerical condition of the mapping from the given moments (or modified moments) to the recurrence coefficients. A study of the related condition numbers is therefore of practical interest. In contrast to moment-based algorithms, discretization methods are basically approximate methods: one approximates the underlying inner product by a discrete inner product and takes the recurrence coefficients of the corresponding discrete orthogonal polynomials as approximations to those of the desired orthogonal polynomials. Finding discretizations that yield satisfactory rates of convergence requires a certain amount of skill and creativity on the part of the user, although general-purpose discretizations are available if all else fails.

Other interesting problems have as their objective the computation of new orthogonal polynomials from old ones. If the measure of the new orthogonal polynomials is the measure of the old ones multiplied by a rational function, one speaks of modification of orthogonal polynomials, and of modification algorithms that carry out the transition from the old to the new orthogonal polynomials. This enters into a circle of ideas already investigated by Christoffel in the 1850s, but effective algorithms have been obtained only recently; they require the computation of Cauchy integrals of orthogonal polynomials, another interesting computational problem. In the 1960s a new type of orthogonal polynomials emerged, the so-called Sobolev orthogonal polynomials, which are based on inner products involving derivatives. Although they present their own computational challenges, moment-based algorithms and discretization methods remain two of the principal tools of the trade. The computation of zeros of Sobolev orthogonal polynomials is of particular interest in practice.

An important application of orthogonal polynomials is to quadrature, specifically quadrature rules of the highest algebraic degree of exactness. Foremost among them is the Gaussian quadrature rule and its close relatives, the Gauss–Radau and Gauss–Lobatto rules. More recent extensions are due to Kronrod, who inserts n+1 new nodes into a given n-point Gauss formula, again optimally with respect to degree of exactness, and to Turán, who allows derivative terms to appear in the quadrature sum. When integrating functions having poles outside the interval of integration, quadrature rules of polynomial/rational degree of exactness are of interest. Poles inside the interval of integration give rise to Cauchy principal value integrals, which pose computational problems of their own. Interpreting Gaussian quadrature sums in terms of matrices allows interesting applications to the computation of matrix functionals.

In the realm of approximation, orthogonal polynomials, especially discrete ones, find use in curve fitting, e.g. in the least squares approximation of discrete data. This indeed is the problem in which orthogonal polynomials (in substance if not in name) first appeared, in the 1850s, in the work of Chebyshev. Sobolev orthogonal polynomials also had their origin in least squares approximation, when one tries to fit functions simultaneously with some of their derivatives. Physically motivated are approximations by spline functions that preserve as many moments as possible; interestingly, these too are related to orthogonal polynomials via Gauss and generalized Gauss-type quadrature formulae. Slowly convergent series whose sum can be expressed ...
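The two basic steps above can be made concrete with a short sketch. What follows is a minimal illustration in Python/NumPy, not the Matlab software the book describes: a discretized Stieltjes procedure generates the recurrence coefficients of a discrete inner product, and the Golub–Welsch construction then turns those coefficients into a Gauss quadrature rule. The weight function e^t on [-1, 1], the 200-point Gauss–Legendre discretization, and all function names are illustrative assumptions, not part of the original software.

```python
import numpy as np

def stieltjes(x, w, n):
    """Discretized Stieltjes procedure (illustrative sketch).

    Given a discrete inner product <f, g> = sum_i w[i] f(x[i]) g(x[i]),
    return the first n recurrence coefficients (alpha, beta) of the monic
    orthogonal polynomials p_{k+1}(t) = (t - alpha_k) p_k(t) - beta_k p_{k-1}(t),
    with the convention beta_0 = sum_i w[i].
    """
    alpha, beta = np.zeros(n), np.zeros(n)
    p_prev = np.zeros_like(x)          # p_{-1}
    p_curr = np.ones_like(x)           # p_0
    beta[0] = np.sum(w)
    norm2_prev = beta[0]
    for k in range(n):
        norm2 = np.sum(w * p_curr**2)                  # <p_k, p_k>
        alpha[k] = np.sum(w * x * p_curr**2) / norm2
        if k > 0:
            beta[k] = norm2 / norm2_prev
        norm2_prev = norm2
        p_prev, p_curr = p_curr, (x - alpha[k]) * p_curr - beta[k] * p_prev
    return alpha, beta

def gauss_rule(alpha, beta, n):
    """Golub-Welsch: n-point Gauss rule from recurrence coefficients.

    Nodes are the eigenvalues of the symmetric tridiagonal Jacobi matrix;
    weights are beta_0 times the squared first components of the
    normalized eigenvectors.
    """
    J = (np.diag(alpha[:n])
         + np.diag(np.sqrt(beta[1:n]), 1)
         + np.diag(np.sqrt(beta[1:n]), -1))
    nodes, V = np.linalg.eigh(J)
    return nodes, beta[0] * V[0, :]**2

# Illustrative nonclassical weight: w(t) = exp(t) on [-1, 1], discretized
# by a fine Gauss-Legendre rule to define the discrete inner product.
t, lam = np.polynomial.legendre.leggauss(200)
x, w = t, lam * np.exp(t)

n = 5
alpha, beta = stieltjes(x, w, n)
nodes, weights = gauss_rule(alpha, beta, n)

# The n-point rule reproduces the discrete moments up to degree 2n - 1.
for k in (0, 3, 2 * n - 1):
    print(k, np.sum(weights * nodes**k), np.sum(w * x**k))
```

By construction, the n-point rule is exact for the discrete inner product up to degree 2n-1, so the final loop should reproduce the reference sums to machine precision; how well the rule then represents the continuous measure is determined by the quality of the discretization.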

Related resources

ORTHOGONAL ZERO INTERPOLANTS AND APPLICATIONS

Orthogonal zero interpolants (OZIs) are polynomials that interpolate the zero function at a finite number of pre-assigned nodes and satisfy an orthogonality condition. OZIs can be constructed by means of a three-term recurrence relation. These interpolants prove useful in the solution of constrained approximation problems and in the structure of Gauss-type quadrature rules. We present some theoretical...


On Multivariate Chebyshev Polynomials and Spectral Approximations on Triangles

In this paper we describe the use of multivariate Chebyshev polynomials in computing spectral derivations and Clenshaw–Curtis type quadratures. The multivariate Chebyshev polynomials give a spectrally accurate approximation of smooth multivariate functions. In particular we investigate polynomials derived from the A2 root system. We provide analytic formulas for the gradient and integral of A2 ...


A fractional type of the Chebyshev polynomials for approximation of solution of linear fractional differential equations

In this paper we introduce a type of fractional-order polynomials based on the classical Chebyshev polynomials of the second kind (FCSs). We also construct the operational matrix of the fractional derivative of order $\gamma$ in the Caputo sense for the FCSs and show that this matrix, together with the Tau method, can be used to reduce the solution of some fractional-order differential equations to the solution of a system of algebraic equations.
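Operational-matrix constructions of this kind rest on the closed-form Caputo derivative of power functions. As a hedged illustration only, not the FCS operational matrix or the Tau solver of the paper, the Python sketch below evaluates the standard formula $D^\gamma t^\nu = \Gamma(\nu+1)/\Gamma(\nu-\gamma+1)\, t^{\nu-\gamma}$ for $0 < \gamma < 1$ and checks it against a direct numerical evaluation of the Caputo integral; the chosen values of $\gamma$, $\nu$, and $t$ are arbitrary.

```python
from scipy.special import gamma
from scipy.integrate import quad

def caputo_power_closed_form(nu, g, t):
    # Closed form for 0 < g < 1 and nu > 0:
    #   D^g t^nu = Gamma(nu + 1) / Gamma(nu - g + 1) * t^(nu - g)
    return gamma(nu + 1.0) / gamma(nu - g + 1.0) * t**(nu - g)

def caputo_power_numeric(nu, g, t):
    # Caputo definition for 0 < g < 1:
    #   D^g f(t) = 1 / Gamma(1 - g) * int_0^t f'(s) (t - s)^(-g) ds,
    # here with f(s) = s^nu; the integrand has an integrable endpoint singularity.
    integrand = lambda s: nu * s**(nu - 1.0) * (t - s)**(-g)
    val, _ = quad(integrand, 0.0, t)
    return val / gamma(1.0 - g)

g, t = 0.5, 0.8
for nu in (1.0, 2.0, 2.5):
    print(nu, caputo_power_closed_form(nu, g, t), caputo_power_numeric(nu, g, t))
```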


Computing polynomials orthogonal with respect to densely oscillating and exponentially decaying weight functions and related integrals

Software (in Matlab) is developed for computing variable-precision recurrence coefficients for orthogonal polynomials with respect to the weight functions 1 + sin(1/t), 1 + cos(1/t), and e^{-1/t} on [0, 1], as well as e^{-1/t-t} on [0, ∞) and e^{-1/t^2-t^2} on (−∞, ∞). Numerical examples are given involving Gauss quadrature relative to these weight functions.


Orthogonal polynomials: applications and computation

We give examples of problem areas in interpolation, approximation, and quadrature that call for orthogonal polynomials not of the classical kind. We then discuss numerical methods of computing the respective Gauss-type quadrature rules and orthogonal polynomials. The basic task is to compute the coefficients in the three-term recurrence relation for the orthogonal polynomials. This can be done...




Publication date: 2004